Asymptotic Post-Selection Inference for Akaike's Information Criterion



Similar Articles


Valid Post-Selection Inference

It is common practice in statistical data analysis to perform data-driven variable selection and derive statistical inference from the resulting model. Such inference enjoys none of the guarantees that classical statistical theory provides for tests and confidence intervals when the model has been chosen a priori. We propose to produce valid “post-selection inference” by reducing the problem to...

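As a hedged illustration of the problem the abstract above describes (not the authors' procedure), the Python sketch below simulates data-driven selection of a single predictor followed by naive OLS inference in the selected model; the sample size, number of candidate predictors, and confidence level are arbitrary assumptions chosen only to show that naive intervals can under-cover after selection.

```python
# Illustrative sketch only: naive inference after data-driven selection.
# All settings (n, p, alpha, n_rep) are arbitrary assumptions for demonstration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, alpha, n_rep = 50, 5, 0.05, 2000
cover = 0

for _ in range(n_rep):
    X = rng.standard_normal((n, p))
    y = rng.standard_normal(n)               # true coefficients are all zero
    # "Selection": keep the predictor most correlated with y
    j = np.argmax(np.abs(X.T @ y))
    x = X[:, j]
    # Naive OLS inference in the selected one-variable model
    beta = x @ y / (x @ x)
    resid = y - beta * x
    se = np.sqrt(resid @ resid / (n - 1) / (x @ x))
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    if abs(beta) <= t_crit * se:              # interval for beta_j covers the true value 0
        cover += 1

print(f"Nominal coverage: {1 - alpha:.2f}, empirical: {cover / n_rep:.3f}")
# Empirical coverage typically falls below the nominal level because the same
# data were used both to select the model and to form the interval.
```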

Asymptotic Theory of Generalized Information Criterion for Geostatistical Regression Model Selection

Information criteria, such as Akaike's information criterion and the Bayesian information criterion, are often applied in model selection. However, their asymptotic behavior in selecting geostatistical regression models has not been well studied, particularly under the fixed-domain asymptotic framework, in which more and more data are observed in a bounded, fixed region. In this article, we study the genera...

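Since this entry, like the main paper, turns on Akaike's information criterion, a minimal sketch of AIC-based model selection may help fix ideas: AIC = 2k − 2 log L̂, with k the number of fitted parameters and L̂ the maximized likelihood. The Gaussian linear-regression candidates below are a made-up example, not the geostatistical setting of the abstract.

```python
# Minimal sketch of AIC-based model selection for Gaussian linear regression.
# Candidate models and data are made up for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.standard_normal((n, 4))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(n)  # only x0, x1 matter

def gaussian_aic(y, design):
    """AIC = 2k - 2 log(L_hat); k counts coefficients plus the error variance."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    sigma2 = resid @ resid / n                         # MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = design.shape[1] + 1
    return 2 * k - 2 * loglik

best = None
for r in range(5):
    for cols in itertools.combinations(range(4), r):
        design = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
        aic = gaussian_aic(y, design)
        if best is None or aic < best[0]:
            best = (aic, cols)

print(f"AIC-selected predictors: {best[1]}, AIC = {best[0]:.1f}")
```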

A non asymptotic penalized criterion for Gaussian mixture model selection

Specific Gaussian mixtures are considered to solve variable selection and clustering problems simultaneously. A non-asymptotic penalized criterion is proposed to choose the number of mixture components and the relevant variable subset. Because of the non-linearity of the associated Kullback-Leibler contrast on Gaussian mixtures, a general model selection theorem for MLE proposed by Massart (200...

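The non-asymptotic criterion in the abstract above is specific to that paper and is not reproduced here; as a loosely related sketch, the snippet below selects the number of Gaussian mixture components with the standard BIC as implemented in scikit-learn (a tooling assumption, not the paper's criterion).

```python
# Hedged sketch: choosing the number of mixture components with a penalized
# criterion. BIC from scikit-learn stands in for the paper's non-asymptotic one.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Synthetic data: three well-separated 2-D Gaussian clusters (arbitrary choice).
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(150, 2))
               for c in ((0, 0), (4, 0), (2, 3))])

bic = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
       for k in range(1, 7)}
k_best = min(bic, key=bic.get)
print(f"BIC-selected number of components: {k_best}")
```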

Noise derived information criterion for model selection

This paper proposes a new complexity-penalization model selection strategy derived from the minimum-risk principle and the behavior of candidate models under noisy conditions. The strategy seems robust in small-sample conditions and tends to the AIC criterion as the sample size grows. The simulation study at the end of the paper shows that the proposed criterion is extremely compet...



Journal

Journal title: SSRN Electronic Journal

Year: 2018

ISSN: 1556-5068

DOI: 10.2139/ssrn.3167253